
    Graphical Reasoning in Compact Closed Categories for Quantum Computation

    Compact closed categories provide a foundational formalism for a variety of important domains, including quantum computation. These categories have a natural visualisation as a form of graphs. We present a formalism for equational reasoning about such graphs and develop this into a generic proof system with a fixed logical kernel for equational reasoning about compact closed categories. Automating this reasoning process is motivated by the slow and error-prone nature of manual graph manipulation. A salient feature of our system is that it provides a formal and declarative account of derived results that can include 'ellipses'-style notation. We illustrate the framework by instantiating it for a graphical language of quantum computation and show how this can be used to perform symbolic computation. Comment: 21 pages, 9 figures. This is the journal version of the paper published at AIS
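The equational reasoning the abstract describes amounts to rewriting subgraphs of a diagram according to sound rules. A minimal sketch of one such rewrite, in the spirit of ZX-calculus-style graphical languages (the `Diagram` class and the "fuse adjacent same-kind nodes, adding phases" rule are illustrative assumptions, not the authors' system):

```python
# Illustrative sketch of graph rewriting for a graphical language.
# Not the paper's proof system: node kinds, phases, and the fusion
# rule are hypothetical, modelled on spider fusion in ZX-like calculi.

class Diagram:
    def __init__(self):
        self.nodes = {}     # node id -> (kind, phase)
        self.edges = set()  # frozensets {u, v}

    def add_node(self, nid, kind, phase=0.0):
        self.nodes[nid] = (kind, phase)

    def add_edge(self, u, v):
        self.edges.add(frozenset((u, v)))

def fuse_adjacent_spiders(d):
    """Rewrite rule: two adjacent nodes of the same kind fuse into one,
    adding their phases. Applies the first match (in a deterministic
    order) and returns True if a rewrite happened."""
    for e in sorted(d.edges, key=sorted):
        u, v = sorted(e)
        ku, pu = d.nodes[u]
        kv, pv = d.nodes[v]
        if ku == kv:
            d.nodes[u] = (ku, pu + pv)   # merge v into u
            del d.nodes[v]
            d.edges.discard(e)
            for f in list(d.edges):      # re-attach v's edges to u
                if v in f:
                    other = next(iter(f - {v}))
                    d.edges.discard(f)
                    if other != u:
                        d.edges.add(frozenset((u, other)))
            return True
    return False

d = Diagram()
d.add_node("a", "Z", 0.25)
d.add_node("b", "Z", 0.50)
d.add_node("c", "X", 0.00)
d.add_edge("a", "b")
d.add_edge("b", "c")
while fuse_adjacent_spiders(d):   # rewrite to a normal form
    pass
print(d.nodes)
```

Running the loop to a fixed point is the "automated reasoning" step: the two adjacent Z-nodes fuse into one with phase 0.75, leaving it wired to the X-node.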

    Inadequacy of zero-width approximation for a light Higgs boson signal

    In the Higgs search at the LHC, a light Higgs boson (115 GeV <~ M_H <~ 130 GeV) is not excluded by experimental data. In this mass range, the width of the Standard Model Higgs boson is more than four orders of magnitude smaller than its mass. The zero-width approximation is hence expected to be an excellent approximation. We show that this is not always the case. The inclusion of off-shell contributions is essential to obtain an accurate Higgs signal normalisation at the 1% precision level. For gg (-> H) -> VV, V = W,Z, O(10%) corrections occur due to an enhanced Higgs signal in the region M_VV > 2 M_V, where sizable Higgs-continuum interference also occurs. We discuss how experimental selection cuts can be used to exclude this region in search channels where the Higgs invariant mass cannot be reconstructed. We note that the H -> VV decay modes in weak boson fusion are similarly affected. Comment: 26 pages, 18 figures, 6 tables; added references, expanded introduction, version to appear in JHEP
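The point about the zero-width approximation can be seen numerically from the propagator factor alone: the Breit-Wigner |D(s)|^2 = 1/((s - M^2)^2 + (M Gamma)^2) integrates to pi/(M Gamma) over all s, which is what the zero-width limit uses, but its 1/x tails decay slowly, so a non-negligible fraction of the normalisation sits far off shell. A toy check (M = 125 GeV and Gamma = 4 MeV are illustrative SM-like values, not the paper's inputs):

```python
# Toy check of the narrow-width normalisation for a Breit-Wigner
# propagator factor. M and Gamma are illustrative SM-like values.

import math

M, G = 125.0, 0.004          # mass and width in GeV
mg = M * G

def bw(s):
    return 1.0 / ((s - M**2) ** 2 + mg**2)

def integrate(f, a, b, n=200000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Integrate over a window of +/- 50 widths (in s) around the peak.
k = 50.0
num = integrate(bw, M**2 - k * mg, M**2 + k * mg)
exact_window = 2.0 * math.atan(k) / mg   # analytic value on the window
full = math.pi / mg                      # zero-width normalisation

print(num / exact_window)   # quadrature sanity check, ~1.0
print(exact_window / full)  # ~0.987: ~1.3% of the norm lies beyond 50 widths
```

Over a percent of the propagator normalisation lives more than 50 widths from the peak, so percent-level signal normalisation already requires the off-shell region, and any off-shell enhancement of the matrix element (as above the VV threshold) amplifies this further.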

    Optimality-based Analysis of XCSF Compaction in Discrete Reinforcement Learning

    Learning classifier systems (LCSs) are population-based predictive systems that were originally envisioned as agents to act in reinforcement learning (RL) environments. These systems can suffer from population bloat and so are amenable to compaction techniques that try to strike a balance between population size and performance. A well-studied LCS architecture is XCSF, which in the RL setting acts as a Q-function approximator. We apply XCSF to a deterministic and a stochastic variant of the FrozenLake8x8 environment from OpenAI Gym, comparing its performance, in terms of function approximation error and policy accuracy, to the optimal Q-functions and policies produced by solving the environments via dynamic programming. We then introduce a novel compaction algorithm (Greedy Niche Mass Compaction, GNMC) and study its operation on XCSF's trained populations. Results show that, given a suitable parametrisation, GNMC preserves or even slightly improves function approximation error while yielding a significant reduction in population size. Reasonable preservation of policy accuracy also occurs, and we link this metric to the commonly used steps-to-goal metric in maze-like environments, illustrating how the metrics are complementary rather than competitive.
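The dynamic-programming baseline mentioned above can be sketched as tabular Q-value iteration on a tiny deterministic FrozenLake-like grid (the 4x4 layout, rewards, and discount below are illustrative assumptions, not the paper's FrozenLake8x8 setup):

```python
# Q-value iteration on a toy deterministic FrozenLake-style grid.
# Layout, reward scheme, and discount are illustrative, not the paper's.

import itertools

GRID = ["SFFF",
        "FHFH",
        "FFFH",
        "HFFG"]          # S start, F frozen, H hole, G goal
N = 4
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right
GAMMA = 0.95

def step(s, a):
    r, c = divmod(s, N)
    if GRID[r][c] in "HG":            # terminal: absorb, no further reward
        return s, 0.0
    dr, dc = ACTIONS[a]
    nr = min(max(r + dr, 0), N - 1)   # walls clamp movement
    nc = min(max(c + dc, 0), N - 1)
    return nr * N + nc, (1.0 if GRID[nr][nc] == "G" else 0.0)

# Sweep the Bellman optimality update to convergence.
Q = [[0.0] * len(ACTIONS) for _ in range(N * N)]
for _ in range(200):
    for s, a in itertools.product(range(N * N), range(len(ACTIONS))):
        ns, rew = step(s, a)
        Q[s][a] = rew + GAMMA * max(Q[ns])

# Optimal greedy policy, the reference XCSF is compared against.
policy = [max(range(len(ACTIONS)), key=lambda a: Q[s][a]) for s in range(N * N)]
print(max(Q[0]))   # optimal value of the start state
```

The resulting optimal Q-table and greedy policy play the role of the ground truth to which XCSF's function approximation error and policy accuracy are referred.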

    The use of abrasive polishing and laser processing for developing polyurethane surfaces for controlling fibroblast cell behaviour

    Studies have shown that surfaces having micro- and nano-scale features can be used to control cell behaviours including cell proliferation, migration and adhesion. The aim of this work was to compare the use of laser processing and abrasive polishing to develop micro/nano-patterned polyurethane substrates for controlling fibroblast cell adhesion, migration and proliferation. Laser processing in a directional manner resulted in polyurethane surfaces having a ploughed-field effect with micron-scale features. In contrast, abrasive polishing in a directional and a random manner resulted in polyurethane surfaces having sub-micron scale features orientated in a linear or random manner. Results show that, when compared with the flat (non-patterned) polymer, both the laser-processed surface and the abrasively polished surface having randomly organised features promoted significantly greater cell adhesion, while also enhancing cell proliferation after 72 h. In contrast, the abrasively polished surface having linear features did not enhance cell adhesion or proliferation when compared to the flat surface. For cell migration, the cells growing on the laser-processed and the abrasively polished random surfaces showed decreased levels of migration when compared to the flat surface. This study shows that both abrasive polishing and laser processing can be used to produce surfaces having features on the nano-scale and micron-scale, respectively. Surfaces produced using both techniques can be used to promote fibroblast cell adhesion and proliferation. Thus both methods offer a viable alternative to using lithographic techniques for developing patterned surfaces. In particular, abrasive polishing is an attractive method because it is a simple, rapid and inexpensive method that can produce surfaces having features on a comparable scale to more expensive, multi-step methods.

    Next-to-leading order QCD predictions for $Z^0 H^0 + \mathrm{jet}$ production at LHC

    We calculate the complete next-to-leading order (NLO) QCD corrections to $Z^0 H^0$ production in association with a jet at the LHC. We study the impact of the NLO QCD radiative corrections on the integrated and differential cross sections and the dependence of the cross section on the factorization/renormalization scale. We present the transverse momentum distributions of the final-state $Z^0$ boson, Higgs boson and leading jet. We find that the NLO QCD corrections significantly modify the physical observables and clearly reduce the scale uncertainty of the LO cross section. The QCD K-factors are 1.183 and 1.180 at the $\sqrt{s} = 14$ TeV and $\sqrt{s} = 7$ TeV LHC, respectively, when we adopt the inclusive event selection scheme with $p_{T,j}^{cut} = 50$ GeV, $m_H = 120$ GeV and $\mu = \mu_r = \mu_f = \mu_0 \equiv \frac{1}{2}(m_Z + m_H)$. Furthermore, we compare the two scale choices $\mu = \mu_0$ and $\mu = \mu_1 = \frac{1}{2}(E_T^Z + E_T^H + \sum_j E_T^{jet})$, and find that the scale choice $\mu = \mu_1$ seems to be more appropriate than the fixed scale $\mu = \mu_0$. Comment: 18 pages, 7 figures
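The difference between the two scale choices is easy to make concrete: the fixed scale is mu0 = (m_Z + m_H)/2, while the dynamical scale mu1 sums the transverse energies E_T = sqrt(m^2 + pT^2) of the Z, the Higgs, and the jets, event by event. A sketch on a single made-up event (the kinematics are hypothetical; only the two scale definitions come from the abstract):

```python
# Comparing the fixed scale mu0 and the dynamical scale mu1 from the
# abstract on one hypothetical Z0 H0 + jet event.

import math

M_Z, M_H = 91.19, 120.0   # GeV; m_H = 120 GeV as in the abstract

def et(mass, pt):
    """Transverse energy E_T = sqrt(m^2 + pT^2)."""
    return math.hypot(mass, pt)

mu0 = 0.5 * (M_Z + M_H)   # fixed scale, identical for every event

# Hypothetical event: Z with pT = 80 GeV, H with pT = 60 GeV, one 50 GeV jet.
particles = {"Z": (M_Z, 80.0), "H": (M_H, 60.0), "jet": (0.0, 50.0)}
mu1 = 0.5 * sum(et(m, pt) for m, pt in particles.values())

print(round(mu0, 1), round(mu1, 1))
```

For hard events the dynamical scale mu1 sits well above mu0, which is why the two choices give different differential predictions and why a scale tracking the event kinematics can be the more appropriate one.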

    Prospective memory functioning among ecstasy/polydrug users: evidence from the Cambridge Prospective Memory Test (CAMPROMPT)

    Rationale: Prospective memory (PM) deficits in recreational drug users have been documented in recent years. However, the assessment of PM has largely been restricted to self-reported measures that fail to capture the distinction between event-based and time-based PM. The aim of the present study is to address this limitation. Objectives: Extending our previous research, we augmented the range of laboratory measures of PM by employing the CAMPROMPT test battery to investigate the impact of illicit drug use on prospective remembering in a sample of cannabis-only users, ecstasy/polydrug users and non-users of illicit drugs, separating event-based and time-based PM performance. We also administered measures of executive function and retrospective memory in order to establish whether ecstasy/polydrug deficits in PM were mediated by group differences in these processes. Results: Ecstasy/polydrug users performed significantly worse on both event-based and time-based prospective memory tasks in comparison to both the cannabis-only and non-user groups. Furthermore, it was found that across the whole sample, better retrospective memory and executive functioning were associated with superior PM performance. Nevertheless, this association did not mediate the drug-related effects that were observed. Consistent with our previous study, recreational use of cocaine was linked to PM deficits. Conclusions: PM deficits have again been found among ecstasy/polydrug users, which appear to be unrelated to group differences in executive function and retrospective memory. However, the possibility that these are attributable to cocaine use cannot be excluded.

    Phase II study of the oxygen saturation curve left shifting agent BW12C in combination with the hypoxia activated drug mitomycin C in advanced colorectal cancer

    BW12C (5-[2-formyl-3-hydroxyphenoxy]pentanoic acid) stabilizes oxyhaemoglobin, causing a reversible left-shift of the oxygen saturation curve (OSC) and tissue hypoxia. The activity of mitomycin C (MMC) is enhanced by hypoxia. In this phase II study, 17 patients with metastatic colorectal cancer resistant to 5-fluorouracil (5-FU) received BW12C and MMC. BW12C was given as a bolus loading dose of 45 mg kg^-1 over 1 h, followed by a maintenance infusion of 4 mg kg^-1 h^-1 for 5 h. MMC 6 mg m^-2 was administered over 15 min immediately after the BW12C bolus. The 15 evaluable patients had progressive disease after a median of 2 (range 1-4) cycles of chemotherapy. Haemoglobin electrophoresis 3 and 5 h after the BW12C bolus dose showed a fast-moving band consistent with the BW12C-oxyhaemoglobin complex, accounting for approximately 50% of total haemoglobin. The predominant toxicities, nausea/vomiting and vein pain, were mild and did not exceed CTC grade 2. Liver 31P magnetic resonance spectroscopy of patients with hepatic metastases showed no changes consistent with tissue hypoxia. The principle of combining a hypoxically activated drug with an agent that increases tissue hypoxia is clinically feasible, producing an effect equivalent to reducing tumour oxygen delivery by at least 50%. However, BW12C in combination with MMC for 5-FU-resistant colorectal cancer is not an effective regimen. This could be related to drug resistance rather than a failure to enhance cytotoxicity. © 2000 Cancer Research Campaign

    Precision Gauge Unification from Extra Yukawa Couplings

    We investigate the impact of extra vector-like GUT multiplets on the predicted value of the strong coupling. We find in particular that Yukawa couplings between such extra multiplets and the MSSM Higgs doublets can resolve the familiar two-loop discrepancy between the SUSY GUT prediction and the measured value of alpha_3. Our analysis highlights the advantages of the holomorphic scheme, where the perturbative running of gauge couplings is saturated at one loop and further corrections are conveniently described in terms of wavefunction renormalization factors. If the gauge couplings as well as the extra Yukawas are of O(1) at the unification scale, the relevant two-loop correction can be obtained analytically. However, the effect persists also in the weakly-coupled domain, where possible non-perturbative corrections at the GUT scale are under better control. Comment: 26 pages, LaTeX. v6: Important early reference added
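The one-loop running on top of which this two-loop discrepancy arises can be sketched directly: alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - b_i/(2 pi) ln(mu/M_Z) with the standard one-loop MSSM coefficients b = (33/5, 1, -3). The alpha_i^{-1}(M_Z) inputs below are rough illustrative numbers, not a fit:

```python
# One-loop MSSM gauge coupling running. Beta coefficients are the
# standard one-loop MSSM values; the M_Z inputs are rough illustrative
# numbers (GUT-normalised U(1)).

import math

M_Z = 91.19                      # GeV
b = (33 / 5, 1.0, -3.0)          # one-loop MSSM beta coefficients
ainv_mz = (59.0, 29.6, 8.5)      # approximate alpha_i^{-1}(M_Z)

def ainv(i, mu):
    """alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - b_i/(2 pi) ln(mu/M_Z)."""
    return ainv_mz[i] - b[i] / (2 * math.pi) * math.log(mu / M_Z)

# Scale where alpha_1 and alpha_2 meet:
t = 2 * math.pi * (ainv_mz[0] - ainv_mz[1]) / (b[0] - b[1])  # ln(M_GUT/M_Z)
M_GUT = M_Z * math.exp(t)
print(f"M_GUT ~ {M_GUT:.2e} GeV")
print(f"alpha_1,2^-1 = {ainv(0, M_GUT):.2f}, alpha_3^-1 = {ainv(2, M_GUT):.2f}")
```

At one loop the three couplings nearly meet around 2e16 GeV, but alpha_3^{-1} undershoots the alpha_1 = alpha_2 crossing by a small amount; it is the two-loop version of this mismatch that the extra Yukawa couplings are shown to compensate.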

    Predictions for Higgs production at the Tevatron and the associated uncertainties

    We update the theoretical predictions for the production cross sections of the Standard Model Higgs boson at the Fermilab Tevatron collider, focusing on the two main search channels, the gluon-gluon fusion mechanism $gg \to H$ and the Higgs-strahlung processes $q \bar q \to VH$ with $V = W/Z$, including all relevant higher-order QCD and electroweak corrections in perturbation theory. We then estimate the various uncertainties affecting these predictions: the scale uncertainties, which are viewed as a measure of the unknown higher-order effects, the uncertainties from the parton distribution functions and the related errors on the strong coupling constant, as well as the uncertainties due to the use of an effective theory approach in the determination of the radiative corrections in the $gg \to H$ process at next-to-next-to-leading order. We find that while the cross sections are well under control in the Higgs-strahlung processes, the theoretical uncertainties are rather large in the case of the gluon-gluon fusion channel, possibly shifting the central values of the next-to-next-to-leading order cross sections by more than $\approx 40$%. These uncertainties are thus significantly larger than the $\approx 10$% error assumed by the CDF and D0 experiments in their recent analysis that has excluded the Higgs mass range $M_H = 162$-$166$ GeV at the 95% confidence level. These exclusion limits should therefore be reconsidered in the light of these large theoretical uncertainties. Comment: 40 pages, 12 figures. A few typos are corrected and some updated numbers are provided
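Why the total error budget depends so strongly on the combination convention can be shown with placeholder numbers. The central value and the individual relative errors below are hypothetical, not the paper's tables, and the paper's actual combination procedure may differ; the sketch only contrasts a conservative linear addition of sources with the more optimistic addition in quadrature:

```python
# Toy error-budget combination for a gg -> H cross section.
# All numbers are placeholders; only the two combination conventions
# (linear vs quadrature) are being contrasted.

sigma_central = 0.35   # pb, hypothetical Tevatron gg -> H value
rel_errors = {"scale": 0.20, "pdf+alpha_s": 0.15, "eft": 0.05}

linear = sum(rel_errors.values())                      # conservative
quad = sum(e ** 2 for e in rel_errors.values()) ** 0.5 # optimistic

lo, hi = sigma_central * (1 - linear), sigma_central * (1 + linear)
print(f"linear: +/-{100 * linear:.0f}% -> [{lo:.3f}, {hi:.3f}] pb")
print(f"quadrature: +/-{100 * quad:.0f}%")
```

With these inputs the linear sum reaches the ~40% scale quoted in the abstract while the quadrature sum stays near 25%, which is the gap between a conservative theory band and the ~10% error assumed in the experimental combination.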

    Beyond the required LISA free-fall performance: new LISA Pathfinder results down to 20 μHz

    In the months since the publication of the first results, the noise performance of LISA Pathfinder has improved because of reduced Brownian noise due to the continued decrease in pressure around the test masses, from a better correction of noninertial effects, and from a better calibration of the electrostatic force actuation. In addition, the availability of numerous long noise measurement runs, during which no perturbation is purposely applied to the test masses, has allowed the measurement of noise with good statistics down to 20 μHz. The Letter presents the measured differential acceleration noise figure, which is at (1.74±0.05) fm s^{-2}/√Hz above 2 mHz and (6±1)×10 fm s^{-2}/√Hz at 20 μHz, and discusses the physical sources for the measured noise. This performance provides an experimental benchmark demonstrating the ability to realize the low-frequency science potential of the LISA mission, recently selected by the European Space Agency.
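An amplitude spectral density (ASD) figure like 1.74 fm s^-2/√Hz is estimated from a time series by averaging one-sided periodograms over many segments (Welch's method), which is where the "good statistics" from long runs enter. A pure-Python sketch on synthetic white noise whose true ASD is set to the quoted in-band level; it illustrates the estimator only, not the LISA Pathfinder pipeline:

```python
# Averaged-periodogram (Welch-style) ASD estimate on synthetic white
# noise. Sampling rate, segment sizes, and the noise model are
# illustrative; only the estimator itself is the point.

import cmath, math, random

FS = 1.0            # sampling rate, Hz (illustrative)
ASD_TRUE = 1.74     # injected white-noise ASD, fm s^-2 / sqrt(Hz)
SIGMA = ASD_TRUE * math.sqrt(FS / 2)  # per-sample std giving that ASD

random.seed(0)

def periodogram(seg):
    """One-sided periodogram: PSD_k = 2 |X_k|^2 / (fs * N)."""
    n = len(seg)
    psd = []
    for k in range(1, n // 2):   # skip DC and Nyquist bins
        xk = sum(seg[t] * cmath.exp(-2j * math.pi * k * t / n)
                 for t in range(n))
        psd.append(2 * abs(xk) ** 2 / (FS * n))
    return psd

NSEG, SEGLEN = 64, 64
psd_avg = [0.0] * (SEGLEN // 2 - 1)
for _ in range(NSEG):
    seg = [random.gauss(0.0, SIGMA) for _ in range(SEGLEN)]
    for k, p in enumerate(periodogram(seg)):
        psd_avg[k] += p / NSEG

# ASD = sqrt(PSD); average the flat band to beat down the variance.
asd_est = math.sqrt(sum(psd_avg) / len(psd_avg))
print(round(asd_est, 2))   # should land near the injected 1.74
```

Averaging over 64 segments and 31 frequency bins brings the statistical scatter of the recovered level down to the percent scale, mirroring how longer undisturbed runs tighten the error bars on the measured noise figure.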